# Multilingual Mixed Training
## Llama 3 Youko 8B

rinna · large language model · Transformers · multilingual · 1,249 downloads · 60 likes

A Japanese-optimized model based on Meta-Llama-3-8B, continually pretrained on 22 billion tokens of mixed Japanese and English text.
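As a quick usage sketch (not part of the listing itself): loading the checkpoint with Hugging Face transformers. The repository id `rinna/llama-3-youko-8b` and the generation settings below are assumptions; check the model card for recommended usage.

```python
# Minimal sketch: loading a continually-pretrained base model with transformers.
# The repo id "rinna/llama-3-youko-8b" is assumed from the listing above.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "rinna/llama-3-youko-8b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# A base (non-chat) model: give it text to continue, not an instruction.
prompt = "西田幾多郎は、"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```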
## Bangla Llama 7B Instruct v0.1

BanglaLLM · large language model · Transformers · multilingual · 32 downloads · 5 likes

A 7-billion-parameter Bengali large language model built on the Llama 2 architecture and tuned for instruction-following tasks.
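A hedged sketch of prompting the instruction-tuned model. Both the repository id `BanglaLLM/bangla-llama-7b-instruct-v0.1` and the Alpaca-style prompt format are assumptions inferred from the listing, not confirmed by it; the model card may specify a different template.

```python
# Hypothetical usage sketch for the instruction-tuned checkpoint.
# The repo id and the Alpaca-style prompt format are assumptions.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="BanglaLLM/bangla-llama-7b-instruct-v0.1",
    device_map="auto",
)

prompt = (
    "### Instruction:\n"
    "বাংলাদেশের রাজধানীর নাম কী?\n\n"  # "What is the capital of Bangladesh?"
    "### Response:\n"
)
result = generator(prompt, max_new_tokens=64, do_sample=False)
print(result[0]["generated_text"])
```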
## DiscoLM German 7B v1 AWQ

TheBloke · Apache-2.0 · large language model · Transformers · multilingual · 81 downloads · 4 likes

DiscoLM German 7B v1 is a 7B-parameter German language model based on the Mistral architecture, supporting both German and English and released under Apache-2.0; this entry is an AWQ-quantized build of it.
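Since this entry is an AWQ quantization, a minimal sketch of loading 4-bit AWQ weights through transformers, which detects the quantization config from the repository when the `autoawq` package is installed. The repository id `TheBloke/DiscoLM_German_7b_v1-AWQ` is inferred from the listing title and is an assumption.

```python
# Sketch: loading an AWQ-quantized checkpoint with transformers.
# Requires the autoawq package; the repo id below is an assumption.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TheBloke/DiscoLM_German_7b_v1-AWQ"
tokenizer = AutoTokenizer.from_pretrained(model_id)
# AWQ weights are 4-bit; the quantization config ships with the repo,
# so no extra quantization arguments are needed here.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Erkläre kurz, was ein Sprachmodell ist."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```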
## Vietnamese Llama2 7B 40GB

bkai-foundation-models · Other license · large language model · Transformers · multilingual · 23 downloads · 40 likes

A Vietnamese-optimized model based on Llama-2-chat 7B that substantially improves Vietnamese handling through continual pretraining and a more token-efficient Vietnamese tokenizer.
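The tokenizer claim is easy to sanity-check: a sketch comparing token counts on a Vietnamese sentence between the base Llama-2 tokenizer and the extended one. Both repository ids are assumptions taken from the listing, and the meta-llama repository is gated, requiring approved access.

```python
# Sketch: comparing token counts between the base Llama-2 tokenizer and the
# Vietnamese-extended one. Repo ids are assumptions from the listing.
from transformers import AutoTokenizer

base = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-chat-hf")  # gated repo
viet = AutoTokenizer.from_pretrained("bkai-foundation-models/vietnamese-llama2-7b-40GB")

text = "Hà Nội là thủ đô của Việt Nam."
print("base Llama-2 tokens:", len(base(text)["input_ids"]))
print("extended tokenizer: ", len(viet(text)["input_ids"]))
# A vocabulary extended with Vietnamese subwords should need noticeably
# fewer tokens per sentence, which lowers cost per generated character.
```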
## KoAlpaca Llama 1 7B

beomi · Apache-2.0 · large language model · Transformers · multilingual · 213 downloads · 28 likes

KoAlpaca is a Korean counterpart to the Stanford Alpaca model; this release applies the Alpaca recipe to a LLaMA 1 7B base (companion KoAlpaca releases build on Polyglot-ko) and is optimized for Korean text generation tasks.
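A hedged generation sketch for the Korean model. The repository id `beomi/KoAlpaca-llama-1-7b` is a guess from the listing title, and the prompt style is not confirmed by the listing; consult the model card before relying on either.

```python
# Hypothetical sketch: Korean instruction-style generation.
# The repo id is a guess from the listing title, not a confirmed slug.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="beomi/KoAlpaca-llama-1-7b",
    device_map="auto",
)

prompt = "한국의 수도는 어디인가요?\n답변: "  # "What is the capital of Korea?"
result = generator(prompt, max_new_tokens=48, do_sample=False)
print(result[0]["generated_text"])
```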